Principal component analysis (PCA) will be used as a dimensionality-reduction technique to find the overarching dimensions that represent knowledge about social relationships. In this study, we explore the dimensions revealed when the Wish social relationships are rated on a comprehensive list of dimensions drawn from the previous literature on social relationship knowledge.
This dataset was collected from a survey hosted on Amazon Mechanical Turk (MTurk). The survey data were cleaned with a separate Python script. From the cleaned data, we built a matrix of the average rating of each social relationship on the dimensions thought to characterize these relationships; the dimensions comprise all of those previously proposed in the literature.
PCA outputs the same number of components as there are input dimensions. Because the components are ranked by how much variance they explain, we can exclude the later components, which add little additional information.
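The idea above can be sketched with NumPy: for a ratings matrix (the data here are hypothetical, not the survey data), the covariance matrix yields one eigenvalue per input dimension, and the eigenvalues, sorted largest-first, give the proportion of variance each component explains.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical ratings matrix: 30 relationships rated on 8 dimensions.
X = rng.normal(size=(30, 8))

# Center each dimension and take eigenvalues of the covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]  # sorted largest-first

# PCA yields one component (eigenvalue) per input dimension.
assert eigvals.size == X.shape[1]

# Proportion of variance explained by each component, in decreasing order.
explained = eigvals / eigvals.sum()
print(np.round(explained, 3))
```

Truncating this list after the first few components is exactly the dimensionality reduction described above.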
We will use parallel analysis to determine the optimal number of components to retain.
## Parallel analysis suggests that the number of factors = NA and the number of components = 2
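Horn's parallel analysis compares the observed eigenvalues against eigenvalues obtained from random data of the same shape, retaining only components whose eigenvalues exceed chance. A minimal NumPy sketch, using simulated data with two correlated blocks of dimensions (the data and block structure are illustrative assumptions, not the survey data):

```python
import numpy as np

def parallel_analysis(X, n_sims=200, seed=0):
    """Horn's parallel analysis: retain components whose eigenvalues
    exceed the mean eigenvalues of same-shaped random normal data."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    rand = np.empty((n_sims, p))
    for i in range(n_sims):
        R = rng.normal(size=(n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
    return int(np.sum(obs > rand.mean(axis=0)))

# Hypothetical data: two latent factors, four noisy indicators each,
# so parallel analysis should recommend two components.
rng = np.random.default_rng(1)
f = rng.normal(size=(100, 2))
X = np.column_stack([f[:, 0]] * 4 + [f[:, 1]] * 4)
X = X + 0.3 * rng.normal(size=(100, 8))
print(parallel_analysis(X))  # → 2
```

Published implementations (e.g., the psych package's `fa.parallel`) also report a factor-analysis solution, which is why the output above shows both a factor count and a component count.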
Parallel analysis indicates that retaining only 2 components would be optimal. However, to better match the literature, and to be consistent with possible future analyses, we will include 4 components.
In studies 1 and 3A, where the relationship space included only 25 relationships, parallel analysis recommended fewer components than when more relationships were included.
We first run PCA with no rotation to visualize the amount of variance accounted for by each component.
Rotations are used in principal component analyses to make the components easier to interpret. There are two main types of rotation, varimax and oblimin. Here, we will use varimax rotation, which maximizes the variance of the squared loadings so that each dimension loads strongly onto a single component rather than across components. Because varimax is an orthogonal rotation, the resulting components remain uncorrelated with one another; oblimin, by contrast, is an oblique rotation that allows components to correlate.
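The varimax criterion can be sketched with the standard SVD-based iteration. This is a minimal NumPy version applied to a small hypothetical loadings matrix (the matrix is an assumption for illustration; the actual analysis presumably used an existing statistics package):

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Varimax: orthogonal rotation that pushes each dimension's loading
    toward a single component; rotated component scores stay uncorrelated."""
    L = loadings.copy()
    p, k = L.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        LR = L @ R
        # Gradient of the varimax criterion (gamma = 1 in the standard form).
        G = L.T @ (LR**3 - LR @ np.diag((LR**2).sum(axis=0)) / p)
        U, s, Vt = np.linalg.svd(G)
        R = U @ Vt
        var_new = s.sum()
        if var_new - var_old < tol:
            break
        var_old = var_new
    return L @ R

# Hypothetical unrotated loadings: each dimension loads on both components.
L = np.array([[0.7, 0.7], [0.8, 0.6], [0.6, -0.6], [0.7, -0.7]])
rotated = varimax(L)
print(np.round(rotated, 2))
```

After rotation, each row of the loadings matrix is dominated by a single column, which is the simple structure that makes component labeling possible.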
## [1] "First four components account for 82.93% of the variance"
RC1 = Formality
RC2 = Activeness
RC3 = Exchange
RC4 = Valence
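Labels like those above are assigned by inspecting which dimensions load most strongly on each rotated component. A minimal sketch with hypothetical dimension names and loadings (none of these values come from the present study):

```python
import numpy as np

# Hypothetical dimension names and rotated loadings (4 components).
dims = ["politeness", "hierarchy", "frequency", "effort",
        "giving", "receiving", "warmth", "conflict"]
loadings = np.array([
    [0.85, 0.10, 0.05, 0.02],
    [0.80, 0.05, 0.12, 0.08],
    [0.06, 0.82, 0.10, 0.04],
    [0.12, 0.78, 0.02, 0.10],
    [0.04, 0.08, 0.83, 0.05],
    [0.10, 0.03, 0.80, 0.12],
    [0.02, 0.11, 0.06, 0.84],
    [0.08, 0.04, 0.09, -0.79],
])

# For each component, list the two dimensions with the largest
# absolute loadings; these drive the component's interpretive label.
for c in range(loadings.shape[1]):
    order = np.argsort(-np.abs(loadings[:, c]))[:2]
    print(f"RC{c+1}:", [dims[i] for i in order])
```

Note that the sign of a loading matters for interpretation (e.g., a strong negative loading, like conflict on the fourth component here, anchors the low end of a valence-like axis) even though magnitude determines which dimensions define the component.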
Three of the four components (Formality, Activeness, and Valence) match those seen in the previous studies. In the present study, the third component, Exchange, is new and captures a previously undescribed region of the feature space.